Why identify plumes and blooms? Cyanobacteria blooms are one of the most significant management challenges in the Great Lakes today. Recurring blooms of varying toxicity are commonly observed in four of the Great Lakes, and the fifth, Lake Superior, has experienced intermittent nearshore blooms since 2012. The recent advent of cyanobacterial blooms in Lake Superior is disconcerting given the lake's highly valued, pristine water quality. Many fear that the appearance of blooms portends a very different future for Lake Superior. As a public resource, the coastal water quality of Lake Superior has tremendous economic, public health, and environmental value; preventing cyanobacterial blooms in Lake Superior is therefore a high-priority management challenge.
Lake Superior is a large lake, and relying on human observations of blooms restricts monitoring to nearshore locations. Remote sensing has the potential to catalog the spatial and temporal extent of surface blooms. In this project, we are attempting to use optical satellite imagery of Lake Superior to delineate surface plumes (sediment) and blooms (algae). These two surface features are likely to occur at the same time (e.g., a rainstorm may lead to a sediment plume from a river and subsequently an algal bloom).
To train computer algorithms to detect these features in satellite images we need a training dataset. That’s where we need your help! In this exercise, we ask you to participate in identifying changes in surface conditions in the western arm of Lake Superior. All you need is a computer and your eyes.
We will be using Google Earth Engine (GEE) for this project. Instructions on how to use this software and label satellite imagery are detailed in this tutorial. Click the “Initial setup” tab to begin.
If this is your first time using GEE and this classification workflow, please follow the tutorial below to have your account and permissions set up appropriately. You should only need to do this step once.
You can also watch the first 2.5 minutes of this video to visually walk through the setup instructions. Note that the video was originally created for the “GROD workflow”, which was the foundation for the workflow here, so you may need to substitute information that is specific to this project.
3. Create two subdirectories within this new folder - one called ‘test-val’ and another called ‘labels’.
For this application, we want to be sure that every pixel we label truly belongs to its assigned class - so if you have any doubt about the class, don’t label it. Use the examples and descriptions below to understand the types of pixels we are looking to label in this workflow.
First, please ignore the harbor area - outlined in blue with x’s below - when labeling in Tile 1:
openWater: clear, dark pixels with no cloud interference or highly dispersed sediment.
lightNearShoreSediment: yellow and light brown areas, usually near shore.
darkNearShoreSediment: dark brown or red-brown areas, usually near a stream inflow.
offShoreSediment: could also be considered dispersed sediment. Green-ish colored areas proximate to nearshore sediment; it may often look ‘swirled’ in the deeper areas of the lake.
algalBloom: this class is tricky! Technically, it is very hard to discriminate between offshore or dispersed sediment and algal blooms in the imagery provided. In fact, it’s so tricky, we don’t even have an example for you! Please do use this label if you think you see a bloom.
cloud: clouds often appear as you would expect: white or wispy. It’s totally possible that the entire mission-date you have selected is completely cloudy.
Clouds can look green or even black, too:
Sometimes the clouds are barely discernible, but the scene looks ‘hazy’. In this case, don’t label the haze, but do try to avoid it as you label other classes.
shorelineContamination: usually dark pixels or yellow-brown pixels that overlap with the shoreline. The purpose of this label is for us to be able to add ‘uncertainty’ to machine-inferred labels that are proximate to pixels labeled as shorelineContamination. Generally speaking, label things that might be ‘confusing’ for a machine to label because the color is similar to the colors of a sediment plume. The easiest way to detect shoreline contamination is to turn on the ‘Satellite’ base layer in the top right and toggle the ‘Layer 1’ option under ‘Layers’:
Note: it’s very important that you go back to the ‘Map’ base layer when you aren’t labeling shoreline contamination, as it’s easy to mistake the ‘Satellite’ view for the satellite image that you are labeling.
other: anything else present in the image that might be classified as ‘other’ or ‘unknown’. This could be strange image-related issues like this:
Or boats traveling!
These labels, like shorelineContamination, help us identify points of uncertainty when segmenting images.
The purpose of this step is to walk you through how to set up a script in GEE so you can open an image before you actually label it. You will need to do this process for each new mission-date combination you classify.
Go to the project GitHub page.
Copy the appropriate .js script - for the validation exercise, this is eePlumB_validation.js.
The purpose of this step is to have you go through the full process of classifying a few images for practice and to provide us with data to see how similarly all our volunteers classify a range of conditions. We hope that the examples below will help guide everyone to classify conditions similarly, but we understand that there will be variation, as there is a level of subjectivity in this process.
eePlumB.js
var openWater =..., and GEE will prompt you to import these records. Click ‘Convert’ to do so.
6.1 If you haven’t worked in GEE before, you will be directed to a pop-up that will ask you to create your home folder. You won’t be able to change this folder name in the future, so we recommend you use your name or a nickname/email and then hit ‘continue.’
Name your script eePlumB_[YOUR INITIALS]_validation for the validation exercise, or eePlumB_[YOUR INITIALS]_[MISSION]_[DATE] based on the selected mission-date combination you are currently working through.
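As a quick illustration, the naming convention can be assembled like this (a minimal sketch; the initials, mission, and date shown are hypothetical placeholders, not values from this project):

```javascript
// Build a script name following the eePlumB naming convention.
// The initials/mission/date values below are hypothetical examples.
function eePlumBScriptName(initials, mission, date) {
  return 'eePlumB_' + initials + '_' + mission + '_' + date;
}

console.log(eePlumBScriptName('AB', 'LS8', '2022-08-01'));
// → eePlumB_AB_LS8_2022-08-01
```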
Now that you have the script running, you are ready to add labels.
Head to the next section, How to label: Part 2!
The purpose of this step is to teach you how to use our Google Earth Engine framework to classify images. Hopefully, this section is an easy reference if you need a refresher in the future. Note that this step assumes you have already done How to label: Part 1 for the current set of images (either validation or for the specific mission-date you have been assigned).
Hover over Geometry Imports in the map area of GEE. When hovering, you should see a list of sediment and bloom types (see image below). If you do not, revisit Part 1 and follow those instructions carefully.
Change the category to the type of pixel that you are labeling by hovering over openWater (GEE just chooses the first category by default) and clicking the name of the category you want. In our example, we’ll select ‘lightNearShoreSediment’.
Next, click on the map to add a point of this type.
Continue adding points in the current category by clicking the pixel on the map that you want to label. Zoom in or out and scroll (by clicking and dragging) as needed. For each image, label at least 5 pixels for each category you see. Note that you may not see all categories in a given image.
When you have finished labeling a given tile-date, move on to the next tile by selecting the next ‘Tile-Date’ in the eePlumB_validation.js script:
Or the ‘Next Tile’ button in the eePlumB.js script:
If you’re not sure if you’re done, you can click through each tile to make sure there are markers. There are 5 tiles for the eePlumB.js script.
To stop adding points, click Exit next to where it says “Point drawing”. If you want to start adding points again, simply go back to Step 2 and repeat.
If you are working in the eePlumB.js script and need to stop at any point, GEE will save your work, and you can start labeling again by clicking on the script you named eePlumB_[YOUR INITIALS]_[MISSION]_[DATE] in the upper left-hand corner of the GEE code interface:
If you have made a mistake, click on the hand icon at the top left-hand corner of the map (see the red circle on the figure below). Next, click on the point you want to fix. Then you can drag it to a new location or hit delete to remove it. If you moved the point, hit Exit to stop the editing session.
To resume adding points, go back to Step 2 in the previous section.
When you have completed labeling your image, it’s time to export the labels you just created. To do this:
That’s all! If you’re ready to label another image, repeat from ‘How to label: Part 1’ and see the section ‘Selecting a mission-date for the eePlumB.js script’. You do not need to wait for the task to complete to navigate away from this screen or to start a new script.
All mission-date pairs are listed in this Google Sheet. These mission-date pairs have been sorted randomly. As a general rule of thumb, Landsat 5 (‘mission’ = LS5 in the Google Sheet) images are ‘more difficult’ to label because their colors are generally murkier than those of any other mission. We suggest starting with an LS8 or LS9 image when you begin labeling.
When you’ve selected a mission-date to label, put your name and initials in the proper row when you start to label the image. This will prevent unnecessary duplication of labeling. When you’re finished labeling and have exported the data, add ‘yes’ to the ‘finished?’ column.
The row values for the ‘mission’ and ‘date’ columns in the Google Sheet are those that you enter into your `eePlumB_[INITIALS]_[MISSION]_[DATE].js` script in GEE:
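As a sketch, a small helper can sanity-check the values you copy from the sheet before you paste them into your script name. The accepted mission codes and the date format below are assumptions based on the missions named in this tutorial, not guaranteed to match the Google Sheet exactly:

```javascript
// Hypothetical sanity check for mission/date values copied from the sheet.
// Assumes missions LS5/LS8/LS9 (those mentioned in this tutorial) and
// dates formatted as YYYY-MM-DD; adjust if the sheet uses other conventions.
function looksValid(mission, date) {
  const missions = ['LS5', 'LS8', 'LS9'];
  const datePattern = /^\d{4}-\d{2}-\d{2}$/;
  return missions.includes(mission) && datePattern.test(date);
}

console.log(looksValid('LS8', '2022-08-01')); // true
console.log(looksValid('LS8', '8/1/2022'));   // false: unexpected date format
```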
The last two columns of the mission-date Google Sheet list ‘n_PR’, which stands for the number of path-rows included in the image. If n_PR = 1, there may be areas of the image that are missing. It’s okay if you click through ‘tiles’ and you don’t see imagery - it might look like this, only showing the ‘Map’ basemap:
Also, you might come across mission-dates or tiles that are completely cloud-covered:
For these, just label the clouds, and don’t worry about any other classes. Still export the data by following the directions under ‘Finishing up the labeling’.
Here are some useful keyboard shortcuts if you prefer the keyboard to the mouse.